Metamodel-Based Hyperparameter Optimization of Optimization Algorithms in Building Energy Optimization

Authors

Abstract

Building energy optimization (BEO) is a promising technique for achieving energy-efficient designs. The efficacy of the optimization algorithms is imperative for the BEO technique and is significantly dependent on the algorithm hyperparameters. Currently, studies focusing on hyperparameters are scarce, and a common agreement on how to set their values, especially for BEO problems, is still lacking. This study proposes a metamodel-based methodology for the hyperparameter optimization of optimization algorithms applied in BEO, aiming to maximize algorithmic performance and avoid failure caused by improper hyperparameter settings. The method consists of three consecutive steps: constructing a specific BEO problem, developing an ANN-trained metamodel of algorithm performance, and optimizing the hyperparameters with the nondominated sorting genetic algorithm II (NSGA-II). To verify the validity of the methodology, 15 benchmark BEO problems with different properties, i.e., five building models combined with three design variable categories, were constructed for numerical experiments. For each problem, the hyperparameters of four commonly used algorithms, namely the genetic algorithm (GA), the particle swarm optimization (PSO) algorithm, simulated annealing (SA), and the multi-objective genetic algorithm (MOGA), were optimized. Results demonstrated that MOGA benefited the most from the hyperparameter optimization in terms of the quality of the obtained optimum, while PSO benefited the most in terms of computing time.
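The three-step methodology lends itself to a compact illustration. The sketch below is a hypothetical reconstruction, not the paper's code: scikit-learn's MLPRegressor stands in for the ANN-trained metamodel, pymoo's NSGA-II performs the final search, and the tuned algorithm, its two hyperparameters, and their bounds are invented placeholders.

```python
# Minimal sketch of the three-step methodology: sample hyperparameter
# settings, train an ANN metamodel of algorithm performance, then search
# the metamodel with NSGA-II. All problem details are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

rng = np.random.default_rng(0)

def run_algorithm_on_beo(hparams):
    """Placeholder: run the optimizer with these hyperparameters on a
    BEO benchmark and return (optimum quality, computing time)."""
    pop, cx = hparams
    quality = (pop - 60) ** 2 / 1e3 + (cx - 0.8) ** 2 + rng.normal(0, 0.01)
    time_cost = 0.02 * pop
    return quality, time_cost

# Steps 1-2: evaluate sampled settings on the constructed BEO problem and
# fit an ANN mapping hyperparameters -> performance objectives.
X = np.column_stack([rng.uniform(10, 200, 80), rng.uniform(0.1, 1.0, 80)])
Y = np.array([run_algorithm_on_beo(x) for x in X])
metamodel = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000).fit(X, Y)

# Step 3: optimize the hyperparameters on the cheap metamodel with NSGA-II.
class HyperparameterProblem(Problem):
    def __init__(self):
        super().__init__(n_var=2, n_obj=2,
                         xl=np.array([10.0, 0.1]), xu=np.array([200.0, 1.0]))
    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = metamodel.predict(x)

res = minimize(HyperparameterProblem(), NSGA2(pop_size=40), ("n_gen", 30), seed=1)
print("Pareto-optimal hyperparameter settings:\n", res.X)
```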


Similar Resources

Economic Optimization and Energy Consumption in Tray Dryers

In this project, the modeling of food drying using dry air in a laboratory tray dryer is investigated. To analyze moisture transfer during convective drying, a thin-layer moisture-transfer model based on Fick's diffusion equation is considered, which involves simultaneous mass and energy transfer between the solid and gas phases. The temperature and moisture profiles for three types of foodstuff, namely potato, apple, and banana, during...
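For context, the thin-layer drying model referenced here is conventionally written as the series solution of Fick's second law for a slab; the formula below is the standard textbook form, with the usual symbols rather than anything taken from this abstract:

```latex
% Series solution of Fick's second law for a thin slab of half-thickness L
\mathrm{MR} = \frac{M_t - M_e}{M_0 - M_e}
            = \frac{8}{\pi^2} \sum_{n=0}^{\infty} \frac{1}{(2n+1)^2}
              \exp\!\left( -\frac{(2n+1)^2 \pi^2 D_{\mathrm{eff}}\, t}{4L^2} \right)
```

where MR is the moisture ratio, M_t, M_0, and M_e are the current, initial, and equilibrium moisture contents, D_eff is the effective diffusivity, and L is the slab half-thickness.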


Applying Model-Based Optimization to Hyperparameter Optimization in Machine Learning

This talk will cover the main components of sequential model-based optimization algorithms. Algorithms of this kind represent the state-of-the-art for expensive black-box optimization problems and are getting increasingly popular for hyper-parameter optimization of machine learning algorithms, especially on larger data sets. The talk will cover the main components of sequential model-based optim...
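To make the loop concrete, here is a minimal, self-contained sketch of sequential model-based optimization under assumed choices: a Gaussian-process surrogate (scikit-learn) with the expected-improvement acquisition, and a toy 1D objective standing in for an expensive black box.

```python
# Minimal SMBO loop: fit a surrogate to past evaluations, maximize an
# acquisition function over a grid, evaluate the suggested point, repeat.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_blackbox(x):            # stand-in for e.g. a CV error
    return np.sin(3 * x) + 0.3 * x ** 2

grid = np.linspace(-3, 3, 400).reshape(-1, 1)
X = np.array([[-2.0], [0.5], [2.5]])  # initial design
y = expensive_blackbox(X).ravel()

for _ in range(15):
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_blackbox(x_next)[0])

print("best x:", X[np.argmin(y)].item(), "best value:", y.min())
```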


Practical Hyperparameter Optimization

Recently, the bandit-based strategy Hyperband (HB) was shown to yield good hyperparameter settings of deep neural networks faster than vanilla Bayesian optimization (BO). However, for larger budgets, HB is limited by its random search component, and BO works better. We propose to combine the benefits of both approaches to obtain a new practical state-of-the-art hyperparameter optimization metho...
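For orientation, the successive-halving subroutine at the core of Hyperband can be sketched as below; the toy train_and_score function and its configuration space are invented for illustration. BOHB-style methods replace the random sampling here with model-based proposals.

```python
# Minimal successive halving: start many random configurations on a small
# budget, keep the best fraction, and re-run survivors on a larger budget.
import random

def train_and_score(config, budget):
    """Placeholder: train for `budget` epochs and return a validation loss."""
    lr = config["lr"]
    return (lr - 0.1) ** 2 + 1.0 / budget + random.gauss(0, 0.01)

def successive_halving(n=27, min_budget=1, eta=3):
    configs = [{"lr": 10 ** random.uniform(-4, 0)} for _ in range(n)]
    budget = min_budget
    while len(configs) > 1:
        scores = [(train_and_score(c, budget), c) for c in configs]
        scores.sort(key=lambda s: s[0])                # lower loss is better
        configs = [c for _, c in scores[: max(1, len(configs) // eta)]]
        budget *= eta                                  # survivors get more budget
    return configs[0]

random.seed(0)
print("selected configuration:", successive_halving())
```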


Forward and Reverse Gradient-Based Hyperparameter Optimization

We study two procedures (reverse-mode and forward-mode) for computing the gradient of the validation error with respect to the hyperparameters of any iterative learning algorithm such as stochastic gradient descent. These procedures mirror two methods of computing gradients for recurrent neural networks and have different trade-offs in terms of running time and space requirements. Our formulati...
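The two procedures can be demonstrated on a toy problem. The sketch below, with assumed quadratic training and validation losses, chains derivatives through T SGD steps in both directions and checks that forward mode and reverse mode give the same hypergradient of the validation loss with respect to the learning rate.

```python
# Toy forward-mode vs reverse-mode hypergradient: the derivative of a
# validation loss w.r.t. the SGD learning rate, chained through T update
# steps. Quadratic losses are assumed so the Hessian is the scalar `a`.
a, b = 2.0, 1.0          # training loss: 0.5 * a * (w - b)^2
c = 0.8                  # validation loss: 0.5 * (w - c)^2
eta, T, w0 = 0.05, 50, 0.0

def grad_train(w):       # d/dw of the training loss
    return a * (w - b)

# Forward mode: carry dw/d(eta) along with w through every SGD step.
w, dw_deta = w0, 0.0
for _ in range(T):
    dw_deta = (1 - eta * a) * dw_deta - grad_train(w)
    w = w - eta * grad_train(w)
forward_hypergrad = (w - c) * dw_deta

# Reverse mode: store the trajectory, then chain derivatives backwards.
ws = [w0]
for _ in range(T):
    ws.append(ws[-1] - eta * grad_train(ws[-1]))
dL_dw = ws[-1] - c       # gradient of validation loss at the final weights
reverse_hypergrad = 0.0
for t in reversed(range(T)):
    reverse_hypergrad += dL_dw * (-grad_train(ws[t]))
    dL_dw *= (1 - eta * a)       # back through w_{t+1} = w_t - eta*g(w_t)

print(forward_hypergrad, reverse_hypergrad)   # the two modes agree
```

Forward mode carries one tangent per hyperparameter but needs no stored trajectory, while reverse mode handles many hyperparameters at once at the cost of storing (or reconstructing) the training path, which is the trade-off the abstract refers to.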


Gradient-based Hyperparameter Optimization through Reversible Learning

Tuning hyperparameters of learning algorithms is hard because gradients are usually unavailable. We compute exact gradients of cross-validation performance with respect to all hyperparameters by chaining derivatives backwards through the entire training procedure. These gradients allow us to optimize thousands of hyperparameters, including step-size and momentum schedules, weight initialization...
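Building on the same idea, a hypergradient obtained by reverse-mode chaining can itself drive gradient descent on a hyperparameter. The toy loop below (assumed quadratic losses, with the learning rate as the single hyperparameter) is a sketch of that outer optimization, not the paper's memory-efficient reversible scheme, which reconstructs the trajectory backwards instead of storing it.

```python
# Toy hyperparameter optimization by gradient descent: repeatedly backprop
# a validation loss through an unrolled SGD run and update the learning
# rate itself with the resulting exact hypergradient.
a, b, c, T, w0 = 2.0, 1.0, 0.8, 50, 0.0   # same toy quadratics as above

def val_loss_and_hypergrad(eta):
    ws = [w0]
    for _ in range(T):                     # unrolled training run
        ws.append(ws[-1] - eta * a * (ws[-1] - b))
    dL_dw, hg = ws[-1] - c, 0.0
    for t in reversed(range(T)):           # chain derivatives backwards
        hg += dL_dw * (-(a * (ws[t] - b)))
        dL_dw *= (1 - eta * a)
    return 0.5 * (ws[-1] - c) ** 2, hg

eta, meta_lr = 0.02, 0.001
for step in range(100):                    # gradient descent on eta itself
    loss, hg = val_loss_and_hypergrad(eta)
    eta -= meta_lr * hg
# `loss` is the validation loss from the last evaluation before the update.
print(f"tuned learning rate: {eta:.4f}, validation loss: {loss:.6f}")
```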



Journal

Journal title: Buildings

Year: 2023

ISSN: 2075-5309

DOI: https://doi.org/10.3390/buildings13010167